Smooth Backfitting for Additive Modeling with Small Errors-in-Variables, with an Application to Additive Functional Regression for Multiple Predictor Functions
Authors
Abstract
We study smooth backfitting in the presence of errors-in-variables, motivated by functional additive models for a functional regression model with a scalar response and multiple functional predictors that are additive in the functional principal components of the predictor processes. Developing a new smooth backfitting technique for estimating the additive component functions in such models requires addressing the difficulty that the eigenfunctions, and therefore the functional principal components of the predictor processes, which are the arguments of the proposed additive model, are unknown and must be estimated from the data. The estimated functional principal components contain an error that is small for large samples but nevertheless affects the estimation of the additive component functions. This errors-in-variables situation requires new asymptotic theory for smooth backfitting. Our analysis also pertains to general situations in which the predictors of an additive model are contaminated by errors that become smaller asymptotically. We also study the finite-sample properties of the proposed method for functional additive regression through a simulation study and a real data example. AMS 2010 subject classification: 62G08, 62G20.
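The pipeline the abstract describes can be sketched in a few steps: estimate the functional principal component (FPC) scores from discretized predictor curves, then fit an additive model in those estimated scores by backfitting with kernel smoothers. The following is a minimal illustrative sketch, not the authors' implementation; all simulation settings, bandwidth choices, and helper names are assumptions, and the estimated scores carry exactly the kind of small error in the predictors that the paper studies.

```python
# Illustrative sketch (not the paper's method): FPCA scores estimated from
# noisy discretized curves, then Nadaraya-Watson backfitting on the scores.
import numpy as np

rng = np.random.default_rng(0)
n, T = 300, 50                        # number of curves, grid points
t = np.linspace(0, 1, T)

# Simulate predictor curves X_i(t) = xi_1 phi_1(t) + xi_2 phi_2(t) + noise
phi1 = np.sqrt(2) * np.sin(2 * np.pi * t)
phi2 = np.sqrt(2) * np.cos(2 * np.pi * t)
xi1 = rng.normal(0, 2, n)
xi2 = rng.normal(0, 1, n)
X = np.outer(xi1, phi1) + np.outer(xi2, phi2) + rng.normal(0, 0.1, (n, T))

# Scalar response, additive in the (unobserved) true scores
y = np.sin(xi1) + 0.5 * xi2**2 - 0.5 + rng.normal(0, 0.2, n)

# Step 1: estimate eigenfunctions/scores by PCA of the discretized curves.
# The estimated scores differ from the true ones by a small error -- this
# is the errors-in-variables aspect.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U[:, :2] * s[:2]             # estimated FPC scores (up to sign)

def nw(x0, x, r, h):
    """Nadaraya-Watson smoother with a Gaussian kernel."""
    w = np.exp(-0.5 * ((x0[:, None] - x[None, :]) / h) ** 2)
    return (w @ r) / w.sum(axis=1)

# Step 2: backfitting -- cycle through components, smoothing the partial
# residuals against each estimated score until the fits stabilize.
h = scores.std(axis=0) * n ** (-1 / 5)  # rule-of-thumb bandwidths
m = np.zeros((n, 2))
for _ in range(20):
    for j in range(2):
        resid = y - y.mean() - m[:, 1 - j]
        m[:, j] = nw(scores[:, j], scores[:, j], resid, h[j])
        m[:, j] -= m[:, j].mean()     # center components for identifiability

fitted = y.mean() + m.sum(axis=1)
```

This toy version uses ordinary (not smooth) backfitting with two components; the paper's contribution is the asymptotic theory showing how the estimation error in the scores propagates into such component estimates.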
Similar resources
Convex-constrained Sparse Additive Modeling and Its Extensions
Sparse additive modeling is a class of effective methods for performing high-dimensional nonparametric regression. In this work we show how shape constraints such as convexity/concavity and their extensions, can be integrated into additive models. The proposed sparse difference of convex additive models (SDCAM) can estimate most continuous functions without any a priori smoothness assumption. M...
Continuously Additive Models for Nonlinear Functional Regression
We introduce continuously additive models, which can be motivated as extensions of additive regression models with vector predictors to the case of infinite-dimensional predictors. This approach provides a class of flexible functional nonlinear regression models, where random predictor curves are coupled with scalar responses. In continuously additive modeling, integrals taken over a smooth sur...
A Simple Smooth Backfitting Method for Additive Models
In this paper a new smooth backfitting estimate is proposed for additive regression models. The estimate has the simple structure of Nadaraya–Watson smooth backfitting but at the same time achieves the oracle property of local linear smooth backfitting. Each component is estimated with the same asymptotic accuracy as if the other components were known. 1. Introduction. In additive models it is ...
LASSO ISOtone for High Dimensional Additive Isotonic Regression
Additive isotonic regression attempts to determine the relationship between a multi-dimensional observation variable and a response, under the constraint that the estimate is the additive sum of univariate component effects that are monotonically increasing. In this article, we present a new method for such regression called LASSO Isotone (LISO). LISO adapts ideas from sparse linear modelling t...
Bandwidth Selection for Smooth Backfitting in Additive Models
The smooth backfitting introduced by Mammen, Linton and Nielsen [Ann. Statist. 27 (1999) 1443–1490] is a promising technique to fit additive regression models and is known to achieve the oracle efficiency bound. In this paper, we propose and discuss three fully automated bandwidth selection methods for smooth backfitting in additive models. The first one is a penalized least squares approach whi...